The Variational Gaussian Approximation Revisited
Abstract
The variational approximation of posterior distributions by multivariate Gaussians has been much less popular in the machine learning community compared to the corresponding approximation by factorizing distributions. This is for a good reason: the Gaussian approximation is in general plagued by an O(N²) number of variational parameters to be optimized, N being the number of random variables. In this letter, we discuss the relationship between the Laplace and the variational approximation, and we show that for models with Gaussian priors and factorizing likelihoods, the number of variational parameters is actually O(N). The approach is applied to Gaussian process regression with non-Gaussian likelihoods.
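The O(N) result can be illustrated with a short sketch. Under a Gaussian prior N(0, K) and a factorizing likelihood, the optimal variational Gaussian posterior can be written with a mean of the form m = Kν and a covariance of the form S = (K⁻¹ + diag(λ))⁻¹, so only the 2N numbers (ν, λ) need to be optimized instead of the N + N(N+1)/2 entries of a free mean and covariance. The snippet below is a minimal illustration of this parameterization, not the paper's implementation; the RBF kernel, the input points, and the values of ν and λ are all hypothetical placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5

# Hypothetical prior covariance: an RBF kernel on random 1-D inputs,
# with a small jitter term for numerical stability.
x = np.sort(rng.uniform(0.0, 1.0, N))
K = np.exp(-0.5 * (x[:, None] - x[None, :]) ** 2 / 0.1**2) + 1e-8 * np.eye(N)

# The 2N free variational parameters (arbitrary values for illustration).
nu = rng.normal(size=N)                # N parameters for the mean
lam = rng.uniform(0.5, 2.0, size=N)    # N positive parameters for the covariance

# The full N x N posterior mean and covariance are induced by these 2N numbers.
m = K @ nu
S = np.linalg.inv(np.linalg.inv(K) + np.diag(lam))

print(m.shape, S.shape)  # (5,) (5, 5)
```

Note that S is a dense, structured N×N matrix: the structure comes from the prior covariance K, while the likelihood contributes only the N diagonal terms in λ.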
Similar articles
Gaussian Variational Approximate Inference for Generalized Linear Mixed Models
Variational approximation methods have become a mainstay of contemporary Machine Learning methodology, but currently have little presence in Statistics. We devise an effective variational approximation strategy for fitting generalized linear mixed models (GLMM) appropriate for grouped data. It involves Gaussian approximation to the distributions of random effects vectors, conditional on the res...
Variational Inference for Sparse Spectrum Approximation in Gaussian Process Regression
Standard sparse pseudo-input approximations to the Gaussian process (GP) cannot handle complex functions well. Sparse spectrum alternatives attempt to answer this but are known to over-fit. We suggest the use of variational inference for the sparse spectrum approximation to avoid both issues. We model the covariance function with a finite Fourier series approximation and treat it as a random va...
A Relaxed Extra Gradient Approximation Method of Two Inverse-Strongly Monotone Mappings for a General System of Variational Inequalities, Fixed Point and Equilibrium Problems
Gaussian variational approximation with a factor covariance structure
Variational approximation methods have proven to be useful for scaling Bayesian computations to large data sets and highly parametrized models. Applying variational methods involves solving an optimization problem, and recent research in this area has focused on stochastic gradient ascent methods as a general approach to implementation. Here variational approximation is considered for a posteri...
Stochastic Variational Inference for Gaussian Process Latent Variable Models using Back Constraints
Gaussian process latent variable models (GPLVMs) are a probabilistic approach to modelling data that employs Gaussian process mapping from latent variables to observations. This paper revisits a recently proposed variational inference technique for GPLVMs and methodologically analyses the optimality and different parameterisations of the variational approximation. We investigate a structured va...
Journal: Neural Computation
Volume 21, Issue 3
Pages: -
Published: 2009